Theoretical properties of the global optimizer of two layer neural network

Authors

  • Digvijay Boob
  • Guanghui Lan
Abstract

In this paper, we study the problem of optimizing a two-layer artificial neural network that best fits a training dataset. We consider the setting in which the number of parameters is greater than the number of sampled points. We show that for a wide class of differentiable activation functions (a class that includes “almost” all functions that are not piecewise linear), first-order optimal solutions are also globally optimal provided the hidden layer is non-singular. Our results extend easily from the case of a square hidden-layer matrix to that of a flat (wide) matrix. They also apply to networks with more than one hidden layer, provided every hidden layer is non-singular, all activations belong to the given “good” class of differentiable functions, and optimization is carried out only with respect to the last hidden layer. We further study the smoothness properties of the objective function and show that it is Lipschitz smooth, i.e., its gradients do not change sharply. We use this smoothness to guarantee convergence at a rate of O(1/T), where T is the number of iterations, to a first-order optimal solution. Finally, we show that our algorithm maintains non-singularity of the hidden layer for any finite number of iterations.
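As an illustrative aid (not the authors' code), the sketch below sets up the regime the abstract describes: a two-layer network a^T σ(WX) with more parameters than training points, a smooth non-piecewise-linear activation (tanh is used here as an example), and plain gradient descent on the squared loss. It reports the final squared gradient norm and the smallest singular value of the hidden-layer weight matrix W as a proxy for the non-singularity condition. All dimensions, the step size, and the helper loss_and_grads are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Minimal sketch, assuming a two-layer model f(x) = a^T tanh(W x) trained by
# squared loss in the over-parameterized regime (parameters >> samples).
rng = np.random.default_rng(0)

n, d = 20, 50                              # n samples, d-dim inputs; hidden width = d (square W)
X = rng.normal(size=(d, n))
y = rng.normal(size=n)

W = rng.normal(size=(d, d)) / np.sqrt(d)   # hidden-layer weights (ideally kept non-singular)
a = rng.normal(size=d) / np.sqrt(d)        # output-layer weights

sigma  = lambda z: np.tanh(z)              # smooth, non-piecewise-linear activation
dsigma = lambda z: 1.0 - np.tanh(z) ** 2

def loss_and_grads(W, a):
    Z = W @ X                              # (d, n) pre-activations
    H = sigma(Z)                           # (d, n) hidden outputs
    r = a @ H - y                          # (n,) residuals
    loss = 0.5 * np.mean(r ** 2)
    ga = H @ r / n                                   # gradient w.r.t. a
    gW = ((np.outer(a, r) * dsigma(Z)) @ X.T) / n    # gradient w.r.t. W
    return loss, gW, ga

eta, T = 0.1, 2000                         # illustrative step size and iteration budget
for t in range(T):
    _, gW, ga = loss_and_grads(W, a)
    W -= eta * gW
    a -= eta * ga

loss, gW, ga = loss_and_grads(W, a)
grad_norm2 = np.sum(gW ** 2) + np.sum(ga ** 2)
print(f"final loss {loss:.3e}, grad norm^2 {grad_norm2:.3e}, "
      f"min singular value of W {np.linalg.svd(W, compute_uv=False).min():.3e}")
```

For a Lipschitz-smooth objective and a step size below 1/L, the standard descent lemma bounds the best squared gradient norm over T iterations by 2L(f(initial) - f(optimal))/T, which is the O(1/T) rate quoted in the abstract; the printed minimum singular value of W indicates whether the non-singularity condition held at the end of the run.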


Similar articles

A Recurrent Neural Network Model for solving CCR Model in Data Envelopment Analysis

In this paper, we present a recurrent neural network model for solving the CCR model in Data Envelopment Analysis (DEA). The proposed neural network model is derived from an unconstrained minimization problem. On the theoretical side, it is shown that the proposed neural network is stable in the sense of Lyapunov and globally convergent to the optimal solution of the CCR model. The proposed model has...


A Recurrent Neural Network to Identify Efficient Decision Making Units in Data Envelopment Analysis

In this paper we present a recurrent neural network model to recognize efficient Decision Making Units (DMUs) in Data Envelopment Analysis (DEA). The proposed neural network model is derived from an unconstrained minimization problem. On the theoretical side, it is shown that the proposed neural network is stable in the sense of Lyapunov and globally convergent. The proposed model has a single-laye...


Comparing Two Methods of Neural Networks to Evaluate Dead Oil Viscosity

Reservoir characterization and asset management require comprehensive information about formation fluids. In fact, it is not possible to find accurate solutions to many petroleum engineering problems without having accurate pressure-volume-temperature (PVT) data. Traditionally, fluid information has been obtained by capturing samples and then by measuring the PVT properties in a laboratory. In ...


arXiv:1710.11241v1 [cs.LG] 30 Oct 2017: Theoretical properties of the global optimizer of two layer neural network

In this paper, we study the problem of optimizing a two-layer artificial neural network that best fits a training dataset. We look at this problem in the setting where the number of parameters is greater than the number of sampled points. We show that for a wide class of differentiable activation functions (this class involves “almost” all functions which are not piecewise linear), we have that...


Solving the local positioning problem using a four-layer artificial neural network

Today, global positioning systems (GPS) do not work well inside buildings and in dense urban areas where there is no line of sight between the user and the satellites. Hence, local positioning systems (LPS) have been used considerably in recent years. The main purpose of this research is to provide a four-layer artificial neural network based on a nonlinear system solver (NLANN) for local pos...



Journal:
  • CoRR

Volume: abs/1710.11241   Issue: -

Pages: -

Publication date: 2017